5 research outputs found

    MIDGARD: A Simulation Platform for Autonomous Navigation in Unstructured Environments

    We present MIDGARD, an open-source simulation platform for autonomous robot navigation in outdoor unstructured environments. MIDGARD is designed to enable the training of autonomous agents (e.g., unmanned ground vehicles) in photorealistic 3D environments, and to support the generalization skills of learning-based agents through variability in training scenarios. MIDGARD's main features include a configurable, extensible, and difficulty-driven procedural landscape generation pipeline, with fast and photorealistic scene rendering based on Unreal Engine. Additionally, MIDGARD has built-in support for OpenAI Gym, a programming interface for feature extension (e.g., integrating new types of sensors, customizing and exposing internal simulation variables), and a variety of simulated agent sensors (e.g., RGB, depth, and instance/semantic segmentation). We evaluate MIDGARD's capabilities as a benchmarking tool for robot navigation using a set of state-of-the-art reinforcement learning algorithms. The results demonstrate MIDGARD's suitability as a simulation and training environment, as well as the effectiveness of our procedural generation approach in controlling scene difficulty, which is directly reflected in accuracy metrics. The MIDGARD build, source code, and documentation are available at https://midgardsim.org/.
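The built-in OpenAI Gym support mentioned above implies the standard reset/step interaction loop. As a minimal sketch, the stub environment below imitates that interface shape (it is written for this example and is not part of MIDGARD; a real run would instead call `gym.make(...)` with MIDGARD's registered environment id):

```python
import random

class StubNavEnv:
    """Stand-in Gym-style environment: reset() -> obs, step(a) -> (obs, reward, done, info)."""
    def __init__(self, horizon=5):
        self.horizon = horizon  # episode length in steps
        self.t = 0

    def reset(self):
        self.t = 0
        # Placeholder for MIDGARD-style sensor observations (RGB, depth, segmentation)
        return {"rgb": None, "depth": None}

    def step(self, action):
        self.t += 1
        done = self.t >= self.horizon
        reward = 1.0 if done else 0.0  # toy sparse reward for illustration
        return {"rgb": None, "depth": None}, reward, done, {}

def rollout(env):
    """Run one episode with a random policy and return the total reward."""
    obs = env.reset()
    total, done = 0.0, False
    while not done:
        action = random.choice(["forward", "left", "right"])  # random policy placeholder
        obs, reward, done, info = env.step(action)
        total += reward
    return total

print(rollout(StubNavEnv()))
```

Any agent written against this loop (e.g., the reinforcement learning baselines used in the evaluation) can be pointed at a Gym-compatible simulator without code changes.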

    Intuitive Robot Teleoperation through Multi-Sensor Informed Mixed Reality Visual Aids

    © 2021 The Author(s). This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/.
    Mobile robotic systems have evolved to include sensors capable of describing robot status and the operating environment more accurately and reliably than ever before. Exploiting this sensor data effectively is challenging, however, because of the cognitive load imposed on the operator by the large volume of data and its time-dependency constraints. This paper addresses this challenge in remote-vehicle teleoperation by proposing an intuitive way to present sensor data to users by means of mixed reality and visual aids within the user interface. We propose a method for organizing information presentation and a set of visual aids to facilitate visual communication of data in teleoperation control panels. The resulting sensor-information presentation is coherent and intuitive, making it easier for an operator to grasp and comprehend the meaning of the information. This increases situational awareness and speeds up decision-making. Our method is implemented on a real mobile robotic system operating outdoors, equipped with on-board internal and external sensors, GPS, and a reconstructed 3D graphical model provided by an assistant drone. Experiments verified feasibility, and a qualitative assessment confirmed that the visual communication is intuitive and comprehensible, encouraging further development. Peer reviewed.

    An Intelligent Hierarchical Cyber-Physical System for Beach Waste Management: The BIOBLU Case Study

    Nestled at the confluence of nature's grandeur and human civilization, beaches command an influential presence that resonates throughout the environment, society, and culture. However, climate change and pollution threaten beach health and must be properly addressed. Proactive measures involve education, responsible waste management, sustainable infrastructure, and environmental regulations, while reactive ones focus on immediate response and cleanup efforts. Nevertheless, continuous monitoring and cleaning are challenging due to various factors such as beach characteristics, hidden waste, and weather conditions, and are consequently costly. To overcome these challenges, this paper proposes an autonomous system for beach cleaning that adopts an Intelligent Hierarchical Cyber-Physical System (IHCPS) approach and Information and Communication Technologies. The proposed beach waste management (BeWastMan) solution integrates an Unmanned Aerial Vehicle for aerial beach surveillance and monitoring, a ground station for data processing, and an Unmanned Ground Vehicle to collect and sort waste autonomously. The research findings contribute to the development of innovative and fully automated approaches to beach waste management, and demonstrate the feasibility and effectiveness of the BeWastMan IHCPS through a real case study developed within the framework of the BIOBLU project.